Nearly Uniform Validation Improves Compression-Based Error Bounds

Author

  • Eric Bax
Abstract

This paper develops bounds on out-of-sample error rates for support vector machines (SVMs). The bounds are based on the numbers of support vectors in the SVMs rather than on VC dimension. The bounds developed here improve on support vector counting bounds derived using Littlestone and Warmuth’s compression-based bounding technique.
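
For context, the baseline such bounds start from can be sketched in a few lines. The following is a minimal sketch assuming the standard consistent-case form of the Littlestone-Warmuth compression bound; the function name and the zero-training-error assumption are illustrative, not taken from the paper.

    import math

    def compression_bound(n, d, delta=0.05):
        # Littlestone-Warmuth style bound for a consistent compression
        # scheme: with probability at least 1 - delta, a classifier
        # reconstructed from d of n training examples (e.g., an SVM with
        # d support vectors) that is correct on the remaining n - d
        # examples has out-of-sample error rate at most
        # (ln C(n, d) + ln(1 / delta)) / (n - d).
        assert 0 < d < n
        return (math.log(math.comb(n, d)) + math.log(1 / delta)) / (n - d)

    # Example: 1000 training examples, 50 support vectors, 95% confidence.
    print(f"error bound: {compression_bound(1000, 50):.4f}")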


Similar Resources

Visual Pattern Image Coding by a Morphological Approach (RESEARCH NOTE)

This paper presents an improvement of the Visual Pattern Image Coding (VPIC) scheme proposed by Chen and Bovik in [2] and [3]. The patterns in this improved scheme are defined by morphological operations and classified by absolute error minimization. The improved scheme identifies more uniform blocks and reduces the noise effect. Therefore, it improves the compression ratio and image quality i...
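
As an illustration of the classification step described above, the following sketch labels a 4x4 block as uniform or as the pattern minimizing absolute error; the block size, threshold, and the two edge patterns are hypothetical stand-ins, since the actual patterns are derived morphologically in the cited papers.

    import numpy as np

    # Hypothetical +1/-1 edge layouts standing in for the visual patterns.
    H_EDGE = np.repeat([[1], [1], [-1], [-1]], 4, axis=1)  # horizontal edge
    V_EDGE = H_EDGE.T                                      # vertical edge
    PATTERNS = [H_EDGE, V_EDGE]

    def classify_block(block, uniform_thresh=8.0):
        # A block whose total absolute deviation from its mean is small
        # is coded as uniform; otherwise pick the pattern that minimizes
        # the absolute reconstruction error.
        mean = block.mean()
        if np.abs(block - mean).sum() < uniform_thresh:
            return "uniform", mean
        residual = block - mean
        scale = np.abs(residual).mean()
        errors = [np.abs(residual - scale * p).sum() for p in PATTERNS]
        return f"pattern {int(np.argmin(errors))}", (mean, scale)

    print(classify_block(np.full((4, 4), 10.0)))  # -> ('uniform', 10.0)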


Improved Uniform Test Error Bounds

We derive distribution-free uniform test error bounds that improve on VC-type bounds for validation. We show how to use knowledge of test inputs to improve the bounds. The bounds are sharp, but they require intense computation. We introduce a method to trade sharpness for speed of computation. We also compute the bounds for several test cases. Key words: machine learning, learning theory, generalization ...
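
For reference, the VC-type baseline that such uniform bounds improve on can be sketched as Hoeffding's inequality plus a union bound over a finite hypothesis class; this simplification is ours, not the abstract's, and the abstract's bounds are sharper because they use the test inputs.

    import math

    def uniform_baseline_bound(emp_err, n, num_hypotheses, delta=0.05):
        # Hoeffding + union bound: with probability at least 1 - delta,
        # every hypothesis h in a class of size num_hypotheses satisfies
        # err(h) <= emp_err(h) + sqrt(ln(num_hypotheses / delta) / (2 n)).
        return emp_err + math.sqrt(math.log(num_hypotheses / delta) / (2 * n))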


Near-Optimal Bounds for Cross-Validation via Loss Stability

Multi-fold cross-validation is an established practice to estimate the error rate of a learning algorithm. Quantifying the variance reduction gains due to cross-validation has been challenging due to the inherent correlations introduced by the folds. In this work we introduce a new and weak measure called loss stability and relate the cross-validation performance to this measure; we also establ...
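
A minimal sketch of the multi-fold cross-validation estimate the abstract analyzes; train_fn and loss_fn are hypothetical stand-ins for the learning algorithm and the loss under study.

    import random

    def cross_val_error(data, train_fn, loss_fn, k=5, seed=0):
        # Split the data into k folds, train on k - 1 folds, score the
        # held-out fold, and average the per-fold error estimates. The
        # folds overlap in their training sets, which is the source of
        # the correlations the abstract mentions.
        data = list(data)
        assert len(data) >= k
        random.Random(seed).shuffle(data)
        folds = [data[i::k] for i in range(k)]
        fold_errors = []
        for i in range(k):
            train = [x for j, fold in enumerate(folds) if j != i for x in fold]
            model = train_fn(train)
            fold_errors.append(sum(loss_fn(model, x) for x in folds[i]) / len(folds[i]))
        return sum(fold_errors) / k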


(67577) Introduction to Machine Learning. Lecture 1 – Introduction and Gentle Start; Lecture 2 – Bias-Complexity Tradeoff; Lecture 3(a) – MDL; Lecture 3(b) – Validation; Lecture 3(c) – Compression Bounds

In the previous lectures we saw how to express prior knowledge by restricting ourselves to finite hypothesis classes or by defining an order over countable hypothesis classes. In this lecture we show how one can learn even uncountable hypothesis classes by deriving compression bounds. Roughly speaking, we shall see that if a learning algorithm can express the output hypothesis using a small s...
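
The usual argument behind such compression bounds, sketched here in LaTeX for the consistent case with notation assumed rather than taken from the lecture notes: fix the d indices I of the compression set, so the reconstructed hypothesis h_I is independent of the remaining n - d points, then union-bound over all possible compression sets.

    % For each fixed index set I of size d:
    \Pr\bigl[\operatorname{err}(h_I) > \epsilon \,\wedge\, h_I \text{ correct on } S \setminus S_I\bigr]
        \le (1 - \epsilon)^{n - d}
    % Union bound over all \binom{n}{d} compression sets, set to \delta:
    \binom{n}{d}(1 - \epsilon)^{n - d} \le \delta
    \;\Longrightarrow\;
    \epsilon \le \frac{\ln\binom{n}{d} + \ln(1/\delta)}{n - d}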


Error Bounds for Transductive Learning via Compression and Clustering

This paper is concerned with transductive learning. Although transduction appears to be an easier task than induction, there have not been many provably useful algorithms and bounds for transduction. We present explicit error bounds for transduction and derive a general technique for devising bounds within this setting. The technique is applied to derive error bounds for compression schemes suc...



Journal:
  • Journal of Machine Learning Research

Volume: 9  Issue: 

Pages: -

Publication date: 2008